


Climate Change Made Hurricane Melissa 4 Times More Likely, Study Suggests

WIRED

Unusually warm ocean temperatures fueled one of the worst hurricanes on record, and new research finds climate change increased the storm's likelihood. Fueled by unusually warm waters, Hurricane Melissa this week turned into one of the strongest Atlantic storms ever recorded. Now a new rapid attribution study suggests human-induced climate change made the deadly tropical cyclone four times more likely. The storm, which reached Category 5, a designation reserved for hurricanes with the most powerful winds, has killed at least 40 people across the Caribbean so far.
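Rapid attribution studies typically express a result like "four times more likely" as a probability ratio between today's climate and a counterfactual climate without human-caused warming. The sketch below illustrates the arithmetic with invented numbers, not the study's actual figures:

```python
# Illustrative probability-ratio calculation (numbers are hypothetical,
# chosen only to reproduce a 4x result; they are not from the study).
p_factual = 0.04         # assumed annual probability of a Melissa-like storm today
p_counterfactual = 0.01  # assumed probability in a climate without human warming

probability_ratio = p_factual / p_counterfactual
print(f"Probability ratio: {probability_ratio:.1f}x more likely")
```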



Evaluation of Machine and Deep Learning Techniques for Cyclone Trajectory Regression and Status Classification by Time Series Data

Lo, Ethan Zachary, Lo, Dan Chie-Tien

arXiv.org Artificial Intelligence

Abstract: Accurate cyclone forecasting is essential for minimizing loss of life, infrastructure damage, and economic disruption. Traditional numerical weather prediction models, though effective, are computationally intensive and prone to error due to the chaotic nature of atmospheric systems. This study proposes a machine learning (ML) approach to forecasting tropical cyclone trajectory and status using time series data from the National Hurricane Center, including recently added best track wind radii. A two-stage ML pipeline is developed: a regression model first predicts cyclone features (maximum wind speed, minimum pressure, trajectory length, and directional change) using a sliding window of historical data. These outputs are then fed into classification models to predict the cyclone's categorical status. Gradient boosting regression and three classifiers, random forest (RF), support vector machine (SVM), and multi-layer perceptron (MLP), are evaluated. After hyperparameter tuning and synthetic minority oversampling (SMOTE), the RF classifier achieves the highest performance at 93% accuracy, outperforming SVM and MLP across precision, recall, and F1 score. The RF model is particularly robust in identifying minority cyclone statuses and minimizing false negatives. Regression results yield low mean absolute errors, with pressure and wind predictions within 2.2 mb and 2.4 kt, respectively. These findings demonstrate that ML models, especially ensemble-based classifiers, offer an effective, scalable alternative to traditional forecasting methods, with potential for real-time cyclone prediction and integration into decision-support systems.
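The two-stage pipeline the abstract describes can be sketched with scikit-learn. This is a minimal illustration on synthetic data: the window size, feature layout, and class labels are assumptions for demonstration, and the SMOTE oversampling step from the paper is omitted for brevity.

```python
# Sketch of a two-stage pipeline: gradient boosting regression of cyclone
# features, then a random forest classifying storm status from those
# regressed features. Data here is random noise standing in for real
# best-track windows; shapes and targets are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestClassifier
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
n, window = 500, 4                      # 500 samples, 4 past time steps
X = rng.normal(size=(n, window * 3))    # e.g. wind, pressure, heading per step
y_feat = rng.normal(size=(n, 2))        # stage-1 targets: max wind, min pressure
y_status = rng.integers(0, 3, size=n)   # stage-2 target: categorical status

# Stage 1: regression predicts physical cyclone features from the window.
reg = MultiOutputRegressor(GradientBoostingRegressor()).fit(X, y_feat)
feat_pred = reg.predict(X)

# Stage 2: classifier predicts cyclone status from the regressed features.
clf = RandomForestClassifier(random_state=0).fit(feat_pred, y_status)
status_pred = clf.predict(feat_pred)
print(status_pred[:5])
```

In a real setting the two stages would be fit on a training split and evaluated on held-out storms; fitting and predicting on the same data, as above, is only to keep the sketch short.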


Transmission Line Outage Probability Prediction Under Extreme Events Using Peter-Clark Bayesian Structural Learning

Chen, Xiaolin, Huang, Qiuhua, Zhou, Yuqi

arXiv.org Artificial Intelligence

Recent years have seen a notable increase in the frequency and intensity of extreme weather events. With a rising number of power outages caused by these events, accurate prediction of power line outages is essential for safe and reliable operation of power grids. The Bayesian network is a probabilistic model that is very effective for predicting line outages under weather-related uncertainties. However, most existing studies in this area offer general risk assessments, but fall short of providing specific outage probabilities. In this work, we introduce a novel approach for predicting transmission line outage probabilities using a Bayesian network combined with Peter-Clark (PC) structural learning. Our approach not only enables precise outage probability calculations, but also demonstrates better scalability and robust performance, even with limited data. Case studies using data from BPA and NOAA show the effectiveness of this approach, while comparisons with several existing methods further highlight its advantages.
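Once structural learning (here, the PC algorithm) has produced a graph such as wind speed → outage, a Bayesian network reduces outage prediction to conditional probability tables estimated from historical data. The toy sketch below shows only that final CPT step, with an invented structure and invented counts, not the paper's method or data:

```python
# Toy conditional-probability-table lookup for a learned edge
# wind_high -> outage. Counts are hypothetical stand-ins for the kind of
# joint BPA/NOAA records the paper uses.
counts = {
    (True, True): 30,  (True, False): 70,    # high wind: 30% outage rate
    (False, True): 2,  (False, False): 998,  # normal wind: 0.2% outage rate
}

def p_outage_given_wind(wind_high: bool) -> float:
    """CPT entry: P(outage = True | wind_high)."""
    yes = counts[(wind_high, True)]
    no = counts[(wind_high, False)]
    return yes / (yes + no)

print(p_outage_given_wind(True))   # 0.3
print(p_outage_given_wind(False))  # 0.002
```

The appeal of the CPT formulation is exactly what the abstract claims: it yields a specific outage probability per line and weather condition, rather than a coarse risk category.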


Scientists identify eerie new species of deep-sea jellyfish using only high-definition video

Daily Mail - Science & tech

Researchers have identified an eerie new form of deep-sea creature using only high-definition video. In 2015, scientists with the National Oceanic and Atmospheric Administration (NOAA) piloted a remotely operated vehicle through an underwater canyon off the coast of Puerto Rico. At a depth of more than two miles the drone came across a ctenophore, or comb jelly, unlike any other species researchers had encountered. It was rectangular, had two long tentacles and moved as if it was anchored to the seafloor. Unable to take a specimen, researchers used the footage to develop an anatomical diagram of the gelatinous blob, which they've named Duobrachium sparksae.


Antarctic waters: Warmer with more acidity and less oxygen

#artificialintelligence

The increased freshwater from melting Antarctic ice sheets plus increased wind has reduced the amount of oxygen in the Southern Ocean and made it more acidic and warmer, according to new research led by University of Arizona geoscientists. The researchers found Southern Ocean waters had changed by comparing shipboard measurements taken from 1990 to 2004 with measurements taken by a fleet of microsensor-equipped robot floats from 2012 to 2019. The observed oxygen loss and warming around the Antarctic coast are much larger than predicted by a climate model, which could have implications for predictions of ice melt. The discovery drove the research team to improve current climate change computer models to better reflect the environmental changes around Antarctica. "It's the first time we've been able to reproduce the new changes in the Southern Ocean with an Earth system model," said co-author Joellen Russell, a professor of geosciences. The research is the first to incorporate the Southern Ocean's increased freshwater plus additional wind into a climate change model, she said.


AI Helps Cities Predict Natural Disasters

WSJ.com: WSJD - Technology

Give artificial intelligence some of the credit. Hydro One used an electrical-outage prediction tool developed by International Business Machines Corp. that combines AI technology and the resources of IBM's Weather Co. subsidiary. The tool helped predict the severity of the storm and the locations that would be hardest hit, so Hydro One knew where to position 1,400 front-line staff who were needed to restore power and to handle the nearly 130,000 customer calls that came in during the outage. IBM's outage-prediction tool is also being used, with 70% accuracy, by other cities throughout North America to predict power outages as far in advance as 72 hours before storms are expected.


Global warming in Alaska tricked a computer into DELETING data

Daily Mail - Science & tech

Temperatures in the Arctic have been rising so fast in recent decades they have confused a computer designed to measure them. Scientists monitoring a site in Alaska have found that an algorithm at the weather station, which has been recording temperatures for nearly 100 years, deleted all of its data from 2017, and even some from 2016. In what the experts are now calling an 'ironic exclamation point' to rapid climate change, the algorithm flagged the abnormal temperatures observed at the station, as it assumed they were too high to be accurate. When scientists set out at the beginning of December to review the previous month's climate data, they noticed something 'odd': everything from Utqiaġvik, Alaska was missing. The data from 2017 and some of 2016 had been flagged as artificial.
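The failure mode the article describes is a standard automated quality-control check backfiring: readings far outside a station's historical range get flagged as instrument error and excluded. The sketch below illustrates the idea with invented thresholds, not the station's actual QC algorithm:

```python
# Hedged sketch of a range-based QC filter. The mean and tolerance below
# are hypothetical values, not the Utqiagvik station's real parameters.
HISTORICAL_MEAN_C = -12.0  # assumed long-term mean for the month
TOLERANCE_C = 8.0          # assumed "plausible deviation" window

def qc_filter(readings):
    """Split readings into (kept, flagged); values outside
    mean +/- tolerance are flagged as likely instrument error."""
    kept, flagged = [], []
    for t in readings:
        (kept if abs(t - HISTORICAL_MEAN_C) <= TOLERANCE_C else flagged).append(t)
    return kept, flagged

# A run of genuinely warm readings lands outside the historical window
# and is discarded as "artificial" -- the article's failure mode.
kept, flagged = qc_filter([-14.0, -11.5, -2.0, -1.0])
print(kept, flagged)  # [-14.0, -11.5] [-2.0, -1.0]
```

A static threshold like this is exactly what breaks when the underlying climate shifts faster than the threshold is updated.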


What would make a computer biased? Learning a language spoken by humans

#artificialintelligence

One of the amazing (and scary) things about artificial intelligence programs is that in learning to mimic their human masters so perfectly, these wonders of computer software hold up a mirror to patterns of behavior we engage in every day but may not even notice. Beyond their extraordinary usefulness in industry, medicine and communications, these "learning" programs can lay bare the mental shortcuts we humans use to make sense of our world. Indeed, new research with artificial intelligence programs highlights the ethnic and gender biases of English speakers. In a first-of-its-kind effort, a group of Princeton University computer scientists set a widely used artificial intelligence program to the task of learning English by performing a massive "crawl" of the World Wide Web. After gobbling up some 840 billion words, the software developed a vocabulary of 2.2 million distinct words, and the fluency to use them in ways that were grammatically correct.
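Bias in word embeddings is typically quantified as a difference in cosine similarity between a target word and two attribute words, as in WEAT-style association tests. The sketch below uses invented 3-dimensional vectors, not real trained embeddings, purely to show the arithmetic:

```python
# Toy cosine-similarity bias measurement. The vectors are hypothetical
# stand-ins for learned word embeddings.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

vecs = {  # invented example embeddings
    "flower": [0.9, 0.1, 0.0],
    "pleasant": [0.8, 0.2, 0.1],
    "unpleasant": [0.2, 0.8, 0.1],
}

# Association score: positive means "flower" sits closer to "pleasant".
bias = (cosine(vecs["flower"], vecs["pleasant"])
        - cosine(vecs["flower"], vecs["unpleasant"]))
print(f"flower leans pleasant by {bias:.2f}")
```

With real embeddings trained on web text, the same arithmetic applied to names and occupation words is what surfaces the ethnic and gender associations the article describes.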
